
    Optimisation of out-vessel magnetic diagnostics for plasma boundary reconstruction in tokamaks

    To improve the low-frequency spectrum of magnetic field measurements in future tokamak reactors such as ITER, several steady-state magnetic sensor technologies have been considered. For all the studied technologies it is advantageous to place the sensors outside the vacuum vessel, as far from the reactor core as possible to minimize radiation damage and temperature effects, but not so far as to compromise the accuracy of the equilibrium reconstruction. We have studied to what extent increasing the distance between out-vessel sensors and the plasma can be compensated for by sensor accuracy and/or density before the limit imposed by the degeneracy of the problem is reached. The study is particularized for the Swiss TCV tokamak, owing to the quality of its magnetic data and its ability to operate with a wide range of plasma shapes and divertor configurations. We have scanned the plasma boundary reconstruction error as a function of out-vessel sensor density, accuracy and distance to the plasma. The study is performed for both the transient and steady-state phases of the tokamak discharge. We find that, in general, there is a broad region in parameter space where sensor accuracy, density and proximity to the plasma can be traded for one another to obtain a desired level of accuracy in the reconstructed boundary, up to some limit. Extrapolation of the results to a tokamak reactor suggests that a hybrid configuration with sensors inside and outside the vacuum vessel could be used to obtain a good boundary reconstruction during both the transient and the flat-top phases of the discharge, provided out-vessel magnetic sensors of sufficient density and accuracy can be placed far enough outside the vessel to minimize radiation damage. (Comment: 36 pages, 17 figures, accepted for publication in Nuclear Fusion)
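
    The degeneracy limit mentioned in this abstract can be made concrete by looking at the conditioning of the linearized inverse problem that maps plasma currents to sensor signals. The sketch below is not the paper's reconstruction code: it uses a hypothetical toy geometry (2-D current filaments and point field sensors) and simply checks how the condition number of the response matrix, and hence the error amplification of the inversion, changes as sensors are moved outward or thinned out.

```python
# Toy study of how sensor distance and count affect the conditioning of the
# inverse problem "filament currents -> out-vessel field measurements".
# Hypothetical geometry, not the TCV response model used in the paper.
import numpy as np

MU0 = 4e-7 * np.pi

def response_matrix(n_filaments, n_sensors, sensor_radius, filament_radius=0.25):
    """Field magnitude at each sensor per unit current in each 2-D filament."""
    phi_f = np.linspace(0, 2 * np.pi, n_filaments, endpoint=False)
    phi_s = np.linspace(0, 2 * np.pi, n_sensors, endpoint=False)
    fx, fy = filament_radius * np.cos(phi_f), filament_radius * np.sin(phi_f)
    sx, sy = sensor_radius * np.cos(phi_s), sensor_radius * np.sin(phi_s)
    d = np.hypot(sx[:, None] - fx[None, :], sy[:, None] - fy[None, :])
    return MU0 / (2 * np.pi * d)          # |B| = mu0 I / (2 pi d) per filament

for r_s in (0.5, 1.0, 2.0):               # sensor distance from the axis [m]
    for n_s in (16, 32, 64):              # number of out-vessel sensors
        A = response_matrix(n_filaments=12, n_sensors=n_s, sensor_radius=r_s)
        cond = np.linalg.cond(A)          # error amplification of the inversion
        print(f"r_s={r_s:4.1f} m  n_s={n_s:3d}  condition number={cond:10.1e}")
```

    In such a toy model the condition number grows rapidly with sensor distance, which is the same degeneracy that caps how far accuracy and density can compensate for distance in the study above.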

    Good Times Are Drinking Times: Empirical Evidence on Business Cycles and Alcohol Sales in Sweden 1861-2000

    This paper studies the relationship between the business cycle and alcohol sales in Sweden using a data set for the years 1861-2000. Using wavelet-based band-pass filtering, it is found that there is a pro-cyclical relationship, i.e. alcohol sales increase in short-term economic upturns. Using moving-window techniques, we see that the pro-cyclical relationship holds over the entire time period. We also find that alcohol sales are a long-memory process with non-stationary behavior, i.e. a shock to alcohol sales has persistent effects. Keywords: business cycles; alcohol; Sweden
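
    A minimal way to reproduce the first step of such an analysis is to band-pass both series at business-cycle frequencies and correlate the filtered components. The sketch below uses a plain FFT band-pass as a stand-in for the paper's wavelet filter and runs on synthetic annual series; the 2-8 year band and the variable names are illustrative assumptions, not the paper's data or method details.

```python
# Illustrative pro-cyclicality check: band-pass two annual series at
# business-cycle frequencies (here 2-8 years) and correlate the results.
# FFT band-pass used as a simple stand-in for the paper's wavelet filter.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1861, 2001)
gdp_growth = rng.normal(size=years.size).cumsum()        # synthetic stand-in data
alcohol_sales = 0.4 * gdp_growth + rng.normal(size=years.size).cumsum()

def bandpass(x, low_period=2, high_period=8):
    """Keep only Fourier components with periods between low_period and high_period years."""
    freqs = np.fft.rfftfreq(x.size, d=1.0)               # cycles per year
    spec = np.fft.rfft(x - x.mean())
    keep = (freqs >= 1 / high_period) & (freqs <= 1 / low_period)
    return np.fft.irfft(spec * keep, n=x.size)

gdp_cycle = bandpass(gdp_growth)
alc_cycle = bandpass(alcohol_sales)
corr = np.corrcoef(gdp_cycle, alc_cycle)[0, 1]
print(f"cyclical correlation: {corr:+.2f}  (>0 indicates a pro-cyclical relationship)")
```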

    The Impact of Real Options on Willingness to Pay for Investments in Road Safety

    Public investments are dynamic in nature, and decision making must account for uncertainty, irreversibility and the potential for future learning. In this paper we adapt the theory of investment under uncertainty to a public referendum setting and perform the first empirical test showing that estimates of the value of a statistical life (VSL) from stated preference surveys depend strongly on whether the option value is included. Our results indicate an option value of major economic magnitude, which implies that previously reported VSL estimates are exaggerated. Keywords: value of a statistical life; real options; contingent valuation; road safety
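
    The option value discussed here can be illustrated with a textbook two-period real-options calculation (not the paper's empirical model): if an irreversible safety investment can be postponed one period, during which uncertainty about its benefit resolves, the value of the investment opportunity exceeds the static now-or-never NPV, and the difference is the option value. All numbers below are made up for illustration.

```python
# Toy two-period real-option calculation: value of being able to wait one
# period before committing to an irreversible safety investment.
# All numbers are illustrative; this is not the paper's empirical model.
cost = 100.0                              # irreversible investment cost
benefit_up, benefit_down = 160.0, 60.0    # possible benefit, revealed next period
p_up = 0.5                                # probability of the favourable outcome
discount = 1 / 1.05                       # one-period discount factor

expected_benefit = p_up * benefit_up + (1 - p_up) * benefit_down
npv_invest_now = expected_benefit - cost

# If the decision can wait, invest only once the favourable state is revealed.
npv_wait = discount * p_up * max(benefit_up - cost, 0.0)

option_value = npv_wait - npv_invest_now
print(f"invest-now NPV:          {npv_invest_now:6.2f}")
print(f"wait-and-see value:      {npv_wait:6.2f}")
print(f"option value of waiting: {option_value:6.2f}")
```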

    Efficiency of Truthful and Symmetric Mechanisms in One-sided Matching

    We study the efficiency (in terms of social welfare) of truthful and symmetric mechanisms in one-sided matching problems with dichotomous preferences and normalized von Neumann-Morgenstern preferences. We are particularly interested in the well-known Random Serial Dictatorship mechanism. For dichotomous preferences, we first show that truthful, symmetric and optimal mechanisms exist if intractable mechanisms are allowed. We then provide a connection to online bipartite matching. Using this connection, it is possible to design truthful, symmetric and tractable mechanisms that extract 0.69 of the maximum social welfare, under the assumption that agents are not adversarial. Without this assumption, we show that Random Serial Dictatorship always returns an assignment in which the expected social welfare is at least a third of the maximum social welfare. For normalized von Neumann-Morgenstern preferences, we show that Random Serial Dictatorship always returns an assignment in which the expected social welfare is at least $\frac{1}{e}\,\frac{\nu(\mathrm{OPT})^2}{n}$, where $\nu(\mathrm{OPT})$ is the maximum social welfare and $n$ is the number of both agents and items. On the hardness side, we show that no truthful mechanism can achieve a social welfare better than $\frac{\nu(\mathrm{OPT})^2}{n}$. (Comment: 13 pages, 1 figure)
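
    Random Serial Dictatorship itself is easy to state in code: draw a uniformly random order over agents and let each agent, in turn, take their most-preferred remaining item. The sketch below is a generic illustration of the mechanism and of estimating its expected social welfare by Monte Carlo against the optimal assignment; the value matrix is a made-up example, not an instance from the paper.

```python
# Random Serial Dictatorship (RSD) for one-sided matching, with a Monte Carlo
# estimate of its expected social welfare versus the optimal assignment.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(1)
n = 6
values = rng.random((n, n))        # values[i, j]: agent i's value for item j (illustrative)

def rsd_welfare(values, rng):
    """Run one draw of RSD and return the realized social welfare."""
    n = values.shape[0]
    order = rng.permutation(n)                  # uniformly random serial order
    available = np.ones(n, dtype=bool)
    welfare = 0.0
    for agent in order:
        best = np.argmax(np.where(available, values[agent], -np.inf))
        welfare += values[agent, best]
        available[best] = False
    return welfare

expected_rsd = np.mean([rsd_welfare(values, rng) for _ in range(20000)])
row, col = linear_sum_assignment(values, maximize=True)
optimum = values[row, col].sum()                # nu(OPT): maximum social welfare
print(f"E[RSD welfare] ~ {expected_rsd:.3f}, OPT = {optimum:.3f}, ratio = {expected_rsd / optimum:.3f}")
```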

    Can the cosmic X-ray and gamma-ray background be due to reflection of a steep power-law spectrum and Compton scattering by relativistic electrons?

    We reconsider the recent model by Rogers and Field for the origin of the cosmic X-ray and gamma-ray background. In this model the background is due to an unresolved population of AGNs. An individual AGN spectrum contains three components: a power law with energy index alpha = 1.1, an enhanced reflection component, and a component from Compton scattering by relativistic electrons with a low-energy cutoff at some minimum Lorentz factor gamma_min >> 1. The MeV bump seen in the gamma-ray background is then explained by inverse Compton emission from these electrons. We show that the model does not reproduce the shape of the observed X-ray and gamma-ray background below 10 MeV and that it overproduces the background at higher energies. Furthermore, we find the assumptions made for the Compton component to be physically inconsistent. Relaxing the inconsistent assumptions leads to model spectra even more different from that of the observed cosmic background. Thus, we can reject the hypothesis that the high-energy cosmic background is produced by the described model.
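
    For readers unfamiliar with the spectral components named above, the toy sketch below simply evaluates a photon power law of energy index alpha = 1.1 with an exponential cutoff and adds a schematic MeV bump. The normalizations, cutoff energy and bump shape are arbitrary illustration values, not the model parameters discussed in the paper.

```python
# Schematic composition of the spectral components named above: a cut-off
# power law (energy index alpha = 1.1) plus a bump near 1 MeV.
# Normalizations and cutoff energies are arbitrary illustration values.
import numpy as np

energy_kev = np.logspace(0, 4, 9)                 # 1 keV .. 10 MeV
alpha = 1.1                                       # energy spectral index

power_law = energy_kev ** (-alpha) * np.exp(-energy_kev / 300.0)            # cutoff near 300 keV
mev_bump = 2e-4 * np.exp(-0.5 * (np.log(energy_kev / 1000.0) / 0.5) ** 2)   # bump near 1 MeV
total = power_law + mev_bump

for e, f in zip(energy_kev, total):
    print(f"{e:10.1f} keV  F(E) ~ {f:.3e} (arbitrary units)")
```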

    Eulerian and modified Lagrangian approaches to multi-dimensional condensation and collection

    Turbulence is argued to play a crucial role in cloud droplet growth. The combined problem of turbulence and cloud droplet growth is numerically challenging. Here, an Eulerian scheme based on the Smoluchowski equation is compared with two Lagrangian superparticle (or superdroplet) schemes in the presence of condensation and collection. The growth processes are studied either separately or in combination using either two-dimensional turbulence, a steady flow, or just gravitational acceleration without gas flow. Good agreement between the different schemes for the time evolution of the size spectra is observed in the presence of gravity or turbulence. Higher moments of the size spectra are found to be a useful tool to characterize the growth of the largest drops through collection. Remarkably, the tails of the size spectra are reasonably well described by a gamma distribution in cases with gravity or turbulence. The Lagrangian schemes are generally found to be superior to the Eulerian one in terms of computational performance. However, it is shown that the use of interpolation schemes such as the cloud-in-cell algorithm is detrimental in connection with superparticle or superdroplet approaches. Furthermore, the use of symmetric rather than asymmetric collection schemes is shown to reduce the amount of scatter in the results. (Comment: 36 pages, 17 figures)
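
    As a point of reference for the Eulerian approach mentioned above, the discrete Smoluchowski collection equation advances the number density n_k of droplets in mass bin k through gains from coalescence of smaller bins and losses to larger ones. The sketch below integrates it with a constant collection kernel and explicit Euler steps, and then evaluates a few moments of the resulting size spectrum; this is a generic textbook discretization, not the scheme used in the paper.

```python
# Minimal explicit-Euler integration of the discrete Smoluchowski equation
#   dn_k/dt = 1/2 * sum_{i+j=k} K n_i n_j  -  n_k * sum_j K n_j
# with a constant kernel K; a textbook discretization, not the paper's scheme.
import numpy as np

n_bins, kernel, dt, n_steps = 40, 1e-3, 0.1, 500
n = np.zeros(n_bins)
n[0] = 100.0                                   # start with all drops in the smallest bin

for _ in range(n_steps):
    gain = np.zeros(n_bins)
    for k in range(n_bins):
        for i in range(k):                     # bins (i, k-1-i) coalesce into bin k (mass k+1)
            gain[k] += 0.5 * kernel * n[i] * n[k - 1 - i]
    loss = kernel * n * n.sum()
    n = np.maximum(n + dt * (gain - loss), 0.0)

masses = np.arange(1, n_bins + 1)
moments = {p: np.sum(masses ** p * n) for p in (0, 1, 2, 3)}
print("size-spectrum moments M_p =", {p: f"{m:.3g}" for p, m in moments.items()})
```

    Higher moments (p = 2, 3) grow fastest as the largest drops form, which is why the paper uses them to characterize growth by collection.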

    Gaussian process tomography for soft x-ray spectroscopy at WEST without equilibrium information

    Gaussian process tomography (GPT) is a recently developed tomography method based on Bayesian probability theory [J. Svensson, JET Internal Report EFDA-JET-PR(11)24, 2011 and Li et al., Rev. Sci. Instrum. 84, 083506 (2013)]. By modeling the soft X-ray (SXR) emissivity field in a poloidal cross section as a Gaussian process, Bayesian SXR tomography can be carried out in a robust and extremely fast way. Owing to the short execution time of the algorithm, GPT is an important candidate for providing real-time reconstructions with a view to impurity transport and fast magnetohydrodynamic control. In addition, the Bayesian formalism allows the uncertainty on the inferred parameters to be quantified. In this paper, the GPT technique is validated using a synthetic data set expected from the WEST tokamak, and results of its application to the reconstruction of SXR emissivity profiles measured on Tore Supra are shown. The method is compared with a standard algorithm based on minimization of the Fisher information.
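
    Because the line-integrated SXR measurements depend linearly on the emissivity, GPT reduces to a standard linear-Gaussian update: a Gaussian-process prior over the emissivity field is conditioned on the measured chord integrals. The sketch below shows that update on a small 1-D toy grid with a squared-exponential prior; the geometry matrix, length scale and noise level are invented for illustration and are not the WEST diagnostic model.

```python
# Gaussian-process tomography on a toy line-integrated setup:
# posterior over emissivity f given data d = R f + noise, with a GP prior on f.
# Geometry, length scale and noise level are illustrative, not the WEST setup.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 50)                       # 1-D emissivity grid (toy)
true_emissivity = np.exp(-((x - 0.5) / 0.15) ** 2)  # peaked "plasma" profile

# Toy response matrix: each chord integrates the emissivity over a sub-interval.
n_chords = 12
R = np.zeros((n_chords, x.size))
edges = np.linspace(0, x.size, n_chords + 1, dtype=int)
for i in range(n_chords):
    R[i, edges[i]:edges[i + 1]] = 1.0 / x.size

sigma_d = 1e-3
d = R @ true_emissivity + sigma_d * rng.normal(size=n_chords)

# Squared-exponential GP prior on the emissivity field.
length_scale, prior_sigma = 0.1, 1.0
K = prior_sigma**2 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / length_scale) ** 2)

# Linear-Gaussian update: posterior mean and covariance of f given d.
S = R @ K @ R.T + sigma_d**2 * np.eye(n_chords)
gain = K @ R.T @ np.linalg.solve(S, np.eye(n_chords))
post_mean = gain @ d
post_cov = K - gain @ R @ K

print("max reconstruction error:", np.max(np.abs(post_mean - true_emissivity)))
print("mean posterior std:      ", np.mean(np.sqrt(np.diag(post_cov))))
```

    The entire inference is a few dense linear-algebra operations, which is what makes GPT fast enough to be a real-time candidate.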

    Quasi-thermal Comptonization and gamma-ray bursts

    Quasi-thermal Comptonization in internal shocks formed between relativistic shells can account for the high-energy emission of gamma-ray bursts. It is in fact the dominant cooling mechanism if the typical energy of the emitting particles is set either by the balance between heating and cooling or by electron-positron pair production; both processes yield sub- or mildly relativistic energies. In this case the synchrotron spectrum is self-absorbed, providing the seed soft photons for the Comptonization process, whose spectrum is flat [F(ν) ~ const], ending either in an exponential cutoff or in a Wien peak, depending on the scattering optical depth of the emitting particles. Self-consistent particle energies and optical depths are estimated and found to be in agreement with the observed spectra. (Comment: 10 pages, ApJ Letters, accepted for publication)
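
    The flat Comptonized spectrum described above can be sketched schematically as F(ν) ~ const between the self-absorption frequency and a cutoff set by the electron temperature. The snippet below just evaluates such a toy shape with an exponential cutoff near kT_e; the frequencies and temperature are placeholder values, not the burst parameters derived in the paper.

```python
# Schematic flat Comptonization spectrum F(nu) ~ const with an exponential
# cutoff at h*nu ~ k*T_e; placeholder parameters, not the paper's burst model.
import numpy as np

H_OVER_K = 4.8e-11           # h/k in K*s, so h*nu / (k*T) = H_OVER_K * nu / T
nu = np.logspace(14, 22, 9)  # Hz, from optical to gamma rays (illustrative range)
nu_ssa = 1e15                # self-absorption frequency placeholder [Hz]
T_e = 5e9                    # electron temperature placeholder [K]

flux = np.where(nu < nu_ssa, (nu / nu_ssa) ** 2, 1.0)    # self-absorbed rise, then flat
flux = flux * np.exp(-H_OVER_K * nu / T_e)                # exponential cutoff near kT_e
for f_nu, f in zip(nu, flux):
    print(f"nu = {f_nu:.1e} Hz   F(nu) ~ {f:.3e} (arbitrary units)")
```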